Functional magnetic resonance imaging (fMRI) was used to measure brain activity in nine healthy adults while they listened to and read more than two hours of stories from The Moth Radio Hour. These data were used to estimate quantitative models that predict brain activity in each voxel (volumetric pixel) based on the meaning of the words (i.e., semantics) in the stories. We used these models to create functional maps across the cerebral cortex that show how semantic concepts are represented in the brain when participants listen to and read stories. Read the paper describing this research here.
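The voxelwise modeling approach described above can be sketched in a few lines. This is a minimal illustration, not the study's actual pipeline: it uses synthetic data, a single ridge penalty, and NumPy's closed-form solution, whereas the real analysis used fMRI responses, semantic features derived from story transcripts, and cross-validated regularization.

```python
import numpy as np

# Hypothetical sketch of a voxelwise encoding model: predict each voxel's
# response from semantic word features using ridge regression.
# All data here are synthetic; dimensions are arbitrary.
rng = np.random.default_rng(0)
n_time, n_features, n_voxels = 300, 50, 10

X = rng.standard_normal((n_time, n_features))        # semantic features per time point
true_w = rng.standard_normal((n_features, n_voxels)) # ground-truth weights (unknown in practice)
Y = X @ true_w + 0.1 * rng.standard_normal((n_time, n_voxels))  # simulated voxel responses

alpha = 1.0  # ridge penalty; in practice chosen per voxel by cross-validation
w = np.linalg.solve(X.T @ X + alpha * np.eye(n_features), X.T @ Y)

# Per-voxel prediction accuracy: correlation between actual and predicted responses
Y_hat = X @ w
r = np.array([np.corrcoef(Y[:, v], Y_hat[:, v])[0, 1] for v in range(n_voxels)])
print(r.shape)  # one prediction score per voxel
```

In the viewer, maps like the one shown are built from weights analogous to `w`: the learned weights for each voxel indicate which semantic features drive its response.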
In this interactive viewer you can explore predictive models fit to one subject's brain. Colors show the category of words predicted to elicit the largest response in each voxel (legend, bottom left).
Click and drag the brain to rotate. Scroll to zoom. Press 'F' for a flattened cortical view, press 'L' to view region labels, and press 'I' for an inflated brain view. Click a voxel to see more detail. Click 'Next' to begin a short tour. If you run into problems, reach out by email or through GitHub.
Research by Fatma Deniz, Anwar Nunez-Elizalde, Alexander Huth, and Jack Gallant.
Brain viewer by Fatma Deniz, made with pycortex software by James Gao, Mark Lescroart, and Alexander Huth.
The fMRI and stimulus data are available in Deniz et al. 2019.